fix(pt): add `finetune_head` to argcheck #3967
Conversation
Walkthrough: Recent changes enhance the multi-task fine-tuning mechanism by adding the `finetune_head` key to the argcheck and updating the multi-task training and fine-tuning tests to exercise it.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant TestMultitask
    participant FineTuneUtils
    participant ArgCheckUtils
    User->>TestMultitask: Initiate multi-task fine-tuning
    TestMultitask->>ArgCheckUtils: Normalize and update config
    ArgCheckUtils-->>TestMultitask: Return updated config
    TestMultitask->>FineTuneUtils: Call get_finetune_rules with config
    FineTuneUtils-->>TestMultitask: Return fine-tune rules
    TestMultitask->>User: Provide fine-tune rules and results
    Note right of FineTuneUtils: Fine-tune head key handling
```
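To make the `finetune_head` key concrete, the following sketch shows where it might sit in a multi-task model configuration, expressed as a Python dict. The branch names and the descriptor/fitting-net settings are illustrative assumptions, not taken from the PR:

```python
# Illustrative multi-task config fragment. Each entry in "model_dict" is a
# model branch; "finetune_head" (the key this PR adds to argcheck) selects
# which pretrained fitting net to fine-tune from. Leaving it unset means
# the fitting net is randomly initialized.
config = {
    "model": {
        "model_dict": {
            "branch_1": {
                "descriptor": {"type": "se_e2_a"},      # assumed settings
                "fitting_net": {"neuron": [240, 240]},  # assumed settings
                "finetune_head": "branch_1",            # reuse this pretrained head
            },
            "branch_2": {
                "descriptor": {"type": "se_e2_a"},
                "fitting_net": {"neuron": [240, 240]},
                # no "finetune_head": this head starts from random init
            },
        }
    }
}
```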
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@            Coverage Diff            @@
##            devel    #3967     +/-   ##
=========================================
  Coverage   82.82%   82.83%    +0.01%
=========================================
  Files         520      522        +2
  Lines       50869    50867        -2
  Branches     3020     3015        -5
=========================================
+ Hits        42134    42135        +1
+ Misses       7798     7797        -1
+ Partials      937      935        -2
```

☔ View full report in Codecov by Sentry.
Why isn't this option in the argcheck?
This option works only to get the finetune rules before argcheck; see https://github.com/deepmodeling/deepmd-kit/blob/devel/deepmd/pt/entrypoints/main.py#L253. After this, the option is useless. We can delete it from the model definition or keep a placeholder in argcheck; which one do we prefer? @njzjz @wanghan-iapcm
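A minimal sketch of the order of operations described in that comment. The import paths and call signatures are assumptions based on the devel-branch layout at the time of this PR, not a copy of the real entrypoint code:

```python
# Sketch only; see the linked entrypoint for the actual implementation.
from deepmd.pt.utils.finetune import get_finetune_rules  # assumed path
from deepmd.utils.argcheck import normalize              # assumed path

def prepare_finetune(config: dict, finetune_model: str):
    # Build the fine-tune rules while "finetune_head" is still present.
    config, finetune_links = get_finetune_rules(finetune_model, config)
    # The key has served its purpose; drop it from every model branch so
    # the argcheck/normalize pass does not trip over a stale key.
    for branch in config["model"].get("model_dict", {}).values():
        branch.pop("finetune_head", None)
    return normalize(config), finetune_links
```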
We use argcheck to generate the documentation, so all user-defined arguments should be in the argcheck, i.e., in the automatically generated documentation.
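Given that documentation requirement, a placeholder entry in the argcheck is cheap to keep. A hedged sketch using dargs follows; the doc text paraphrases this PR's summary, and the exact placement inside deepmd/utils/argcheck.py may differ:

```python
from dargs import Argument

# Sketch of a documented placeholder for the new key. Marking it optional
# with no default matches the described behavior: when unset, the fitting
# net is randomly initialized.
doc_finetune_head = (
    "The fitting net to initialize from during multi-task fine-tuning. "
    "If not set, the fitting net is randomly initialized."
)
finetune_head_arg = Argument(
    "finetune_head", str, optional=True, doc=doc_finetune_head
)
```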
Add `finetune_head` to argcheck.

Summary by CodeRabbit

- New Features
  - Introduced a new `finetune_head` argument for specifying the fitting net during multi-task fine-tuning, with optional random initialization if not set.
- Bug Fixes
  - Improved handling for specific conditions by automatically removing the `finetune_head` key from the configuration.
- Tests
  - Updated multitask training and finetuning tests to include the new configuration manipulations.
  - Removed the `_comment` field from test configuration files to ensure cleaner test setups.
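As a final illustration of the "configuration manipulations" the Tests bullet refers to, a hypothetical test-side tweak; the branch names `model_1`/`model_2` are made up:

```python
# Hypothetical test setup: one branch reuses a pretrained head, the other
# exercises the random-initialization path by leaving finetune_head unset.
config["model"]["model_dict"]["model_1"]["finetune_head"] = "model_1"
config["model"]["model_dict"]["model_2"].pop("finetune_head", None)
```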